Gradient Formulae for Nonlinear Probabilistic Constraints with Gaussian and Gaussian-Like Distributions
Authors
Abstract
Probabilistic constraints represent a major model of stochastic optimization. A possible approach for solving probabilistically constrained optimization problems consists in applying nonlinear programming methods. In order to do so, one has to provide sufficiently precise approximations for values and gradients of probability functions. For linear probabilistic constraints under Gaussian distribution this can be successfully done by analytically reducing these values and gradients to values of Gaussian distribution functions and computing the latter, for instance, by Genz’ code. For nonlinear models one may fall back on the spherical-radial decomposition of Gaussian random vectors and apply, for instance, Deák’s sampling scheme for the uniform distribution on the sphere in order to compute values of corresponding probability functions. The present paper demonstrates how the same sampling scheme can be used in order to simultaneously compute gradients of these probability functions. More precisely, we prove a formula representing these gradients in the Gaussian case as a certain integral over the sphere again. Later, the result is extended to alternative distributions with an emphasis on the multivariate Student (or T-) distribution.
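As a minimal sketch of the value-computation side described above, the following code estimates P(ξ ∈ M) for a standard Gaussian ξ via the spherical-radial decomposition: directions are sampled uniformly on the sphere (as in Deák's scheme), and the one-dimensional radial probability is evaluated in closed form with the chi distribution. The ellipsoidal constraint g, the helper rho, and the sample count are illustrative assumptions for a star-shaped set containing the origin; they are not taken from the paper, which treats general nonlinear constraints and, additionally, the gradients.

```python
import numpy as np
from scipy.stats import chi

rng = np.random.default_rng(0)
m = 2  # dimension of the Gaussian random vector

# Illustrative nonlinear constraint (assumption): g(x) = x0^2 + 0.5*x1^2 - 1 <= 0,
# an ellipse containing the origin, so {r >= 0 : g(r*v) <= 0} = [0, rho(v)].
def rho(v):
    # largest radius r with g(r*v) <= 0 along direction v
    return 1.0 / np.sqrt(v[0] ** 2 + 0.5 * v[1] ** 2)

# Spherical-radial decomposition for standard Gaussian (covariance = identity):
# P(g(xi) <= 0) = E_{v ~ Uniform(S^{m-1})} [ F_chi_m( rho(v) ) ]
N = 20000
acc = 0.0
for _ in range(N):
    v = rng.standard_normal(m)
    v /= np.linalg.norm(v)              # uniform direction on the sphere
    acc += chi.cdf(rho(v), df=m)        # radial probability in closed form
est = acc / N
```

Because the radial integral is exact, each sphere sample contributes a smooth value in [0, 1], which typically gives a much lower-variance estimate than crude Monte Carlo on the indicator function; the same sphere samples can then be reused for the gradient integral derived in the paper.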
Similar resources
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms have been applied on Radial Basis Function Neural Network (RBFNN) to approximate the functions with high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea is concerning the various strategies to optimize the procedure of Gradient ...
Nonlinear Evolution of Topology of Large Scale Structure
As a statistical measure of large-scale structure of the universe, the genus number is presented in analytic form for realistic non-Gaussian distributions, i.e., weakly non-Gaussian, lognormal and chi-square distributions. These formulae are compared with the results of N-body simulation. It is shown that the weakly non-Gaussian formula describes the behavior of genus in weakly nonlinear regime...
Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties
The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X,Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaus...
Evaluation and Application of the Gaussian-Log Gaussian Spatial Model for Robust Bayesian Prediction of Tehran Air Pollution Data
Air pollution is one of the major problems of Tehran metropolis. Regarding the fact that Tehran is surrounded by Alborz Mountains from three sides, the pollution due to the cars traffic and other polluting means causes the pollutants to be trapped in the city and have no exit without appropriate wind guff. Carbon monoxide (CO) is one of the most important sources of pollution in Tehran air. The...
Gaussian Z Channel with Intersymbol Interference
In this paper, we derive a capacity inner bound for a synchronous Gaussian Z channel with intersymbol interference (ISI) under input power constraints. This is done by converting the original channel model into an n-block memoryless circular Gaussian Z channel (n-CGZC) and successively decomposing the n-block memoryless channel into a series of independent parallel channels in the frequency dom...
Journal: SIAM Journal on Optimization
Volume 24, Issue -
Pages -
Publication year: 2014